• Introduce students to R through RStudio, a graphical user interface (GUI) for R.
• Review basic principles of data management, data exploration and visualization, and basic statistical tests, including t-tests, linear models, and regression.
• Introduce the concept of scripting to conduct statistical analyses.
• Introduce how to handle spatial data (rasters, vectors) using the R packages sp and raster.
R is a free, open-source statistics software program. R is a command-line-driven, high-level language, meaning a user enters code into the R console to make calculations, conduct analyses, and so on. Unfortunately, this style of software can be daunting for beginners who are not familiar with R's commands or syntax, and because R is free, the help files can sometimes be less than helpful (though they are improving). R Commander smooths over much of this confusion and aggravation by providing a point-and-click window environment in which new users can begin performing basic analyses. In addition to these user-friendly windows, you can also enter R code directly into the R Commander script window and click the submit button to run it; the script window serves as the R console. Note, however, that this friendly environment sacrifices some of the flexibility of the full R language and may become frustrating as you learn R.

In summary, there are 3 main reasons why the ecological world is moving to R:
1) It is free, with a growing and loyal online community of software developers continually improving and adding to its functionality. Chances are, if you have to do a customized analysis, someone has already built a package you can download and use.
2) It is becoming the statistical software of choice for ecologists and statisticians worldwide, and will carry your career for at least the next decade.
3) Most importantly, it is a fully customizable software platform (not just a point-and-click program) for conducting any analysis you need in your graduate research.
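To make the command-line style concrete, here is a minimal, self-contained sketch of what entering commands at the console looks like; the object name `x` is just an illustration, not part of the lab's data:

```r
# Everything in R is done by typing commands like these at the console
x <- c(2, 4, 6, 8)   # create a vector and assign it to an object named x
mean(x)              # call a function on the object; R prints the result: 5
length(x)            # number of elements: 4
help(mean)           # open the help file for any function
```

Each line is submitted to the console (or run from a script window), and R prints the result immediately.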
R Resources
• Website: http://www.r-project.org/index.html
• The R Journal: http://journal.r-project.org/
• Internet searching: http://www.rseek.org/
RStudio (https://www.rstudio.com) is an integrated development environment (IDE) for R, produced by the company of the same name (now Posit), whose developers also maintain popular packages such as ggplot2 and rmarkdown.
R.Version()
## $platform
## [1] "x86_64-w64-mingw32"
##
## $arch
## [1] "x86_64"
##
## $os
## [1] "mingw32"
##
## $crt
## [1] "ucrt"
##
## $system
## [1] "x86_64, mingw32"
##
## $status
## [1] ""
##
## $major
## [1] "4"
##
## $minor
## [1] "2.2"
##
## $year
## [1] "2022"
##
## $month
## [1] "10"
##
## $day
## [1] "31"
##
## $`svn rev`
## [1] "83211"
##
## $language
## [1] "R"
##
## $version.string
## [1] "R version 4.2.2 (2022-10-31 ucrt)"
##
## $nickname
## [1] "Innocent and Trusting"
citation()
##
## To cite R in publications use:
##
## R Core Team (2022). R: A language and environment for statistical
## computing. R Foundation for Statistical Computing, Vienna, Austria.
## URL https://www.R-project.org/.
##
## A BibTeX entry for LaTeX users is
##
## @Manual{,
## title = {R: A Language and Environment for Statistical Computing},
## author = {{R Core Team}},
## organization = {R Foundation for Statistical Computing},
## address = {Vienna, Austria},
## year = {2022},
## url = {https://www.R-project.org/},
## }
##
## We have invested a lot of time and effort in creating R, please cite it
## when using it for data analysis. See also 'citation("pkgname")' for
## citing R packages.
First, we will refresh our data exploration, summary statistics, and generalized linear modeling skills to review some basic foundations of statistics and R together.
We will be using data on track counts of mountain lions (cougars) in and around Banff National Park, Alberta, in the data file cougar.csv.
The file cougar.csv contains data on cougar track counts from the Bow Valley of Banff National Park. Over 5 years, multiple cougars were snow-tracked through various wildlife corridors surrounding the townsite of Banff (Duke et al., 2001), and the number of cougar tracks within 30 m² pixels was counted in one wildlife corridor. This resulted in some cells having multiple cougar track counts and others having none. Park managers wanted to know what habitat factors cougars needed for corridor use, and the dataset includes the count of cougar tracks and associated spatial covariates: slope, distance to cover, distance to roads, and distance to trails.
Duke, D. L., M. Hebblewhite, P. C. Paquet, C. Callaghan, and M. Percy.
2001. Restoration of a large carnivore corridor in Banff National Park.
Pages 261-275 in D. S. Maehr, R. F. Noss, and J. L. Larkin, editors.
Large mammal restoration: ecological and sociological challenges in the
21st century. Island Press, Washington.
Import the data from the csv file:
cougar <- read.csv("Data/cougar.csv", header=TRUE, sep=",", na.strings="NA", dec=".", strip.white=TRUE)
head(cougar)
## Use UseNonUse Slope CoverDist Roads AllTrails
## 1 0 0 9.462322 0 787 228
## 2 0 0 39.856209 0 690 247
## 3 0 0 0.000000 7 150 153
## 4 0 0 11.386165 0 30 671
## 5 0 0 10.555381 0 457 42
## 6 0 0 7.653049 0 335 192
str(cougar)
## 'data.frame': 726 obs. of 6 variables:
## $ Use : int 0 0 0 0 0 0 0 0 0 0 ...
## $ UseNonUse: int 0 0 0 0 0 0 0 0 0 0 ...
## $ Slope : num 9.46 39.86 0 11.39 10.56 ...
## $ CoverDist: int 0 0 7 0 0 0 0 0 16 0 ...
## $ Roads : int 787 690 150 30 457 335 741 711 630 283 ...
## $ AllTrails: int 228 247 153 671 42 192 60 42 300 134 ...
table(cougar$Use, cougar$UseNonUse)
##
## 0 1
## 0 363 0
## 1 0 31
## 2 0 34
## 3 0 28
## 4 0 32
## 5 0 28
## 6 0 121
## 7 0 16
## 8 0 14
## 9 0 10
## 10 0 12
## 11 0 8
## 12 0 18
## 13 0 4
## 14 0 5
## 15 0 1
## 19 0 1
# Cougar data management
## Let's convert the UseNonUse column to a factor for future analyses and store it as factorUSE
cougar$factorUSE <- as.factor(cougar$UseNonUse)
Let's calculate some basic summary statistics for the data set…
summary(cougar)
## Use UseNonUse Slope CoverDist
## Min. : 0.000 Min. :0.0 Min. : 0.000 Min. : 0.000
## 1st Qu.: 0.000 1st Qu.:0.0 1st Qu.: 0.000 1st Qu.: 0.000
## Median : 0.500 Median :0.5 Median : 2.134 Median : 0.000
## Mean : 2.853 Mean :0.5 Mean : 8.111 Mean : 2.959
## 3rd Qu.: 6.000 3rd Qu.:1.0 3rd Qu.:12.773 3rd Qu.: 0.000
## Max. :19.000 Max. :1.0 Max. :47.730 Max. :53.000
## Roads AllTrails factorUSE
## Min. : 0.0 Min. : 0.0 0:363
## 1st Qu.: 260.5 1st Qu.: 67.0 1:363
## Median : 450.0 Median :175.0
## Mean : 465.5 Mean :224.8
## 3rd Qu.: 641.0 3rd Qu.:323.0
## Max. :1400.0 Max. :899.0
# ... by used and unused locations
install.packages("tidyverse")
## package 'tidyverse' successfully unpacked and MD5 sums checked
##
## The downloaded binary packages are in
## C:\Users\Administrator.KJLWS11\AppData\Local\Temp\Rtmp84rxPr\downloaded_packages
library(tidyverse)
## ── Attaching packages ─────────────────────────────────────── tidyverse 1.3.2 ──
## ✔ ggplot2 3.4.0 ✔ purrr 1.0.0
## ✔ tibble 3.1.8 ✔ dplyr 1.0.10
## ✔ tidyr 1.2.1 ✔ stringr 1.5.0
## ✔ readr 2.1.3 ✔ forcats 0.5.2
## ── Conflicts ────────────────────────────────────────── tidyverse_conflicts() ──
## ✖ dplyr::filter() masks stats::filter()
## ✖ dplyr::lag() masks stats::lag()
cougar_df <- as_tibble(cougar)
byUse <- group_by(cougar_df, UseNonUse)
summarise(byUse, slope = mean(Slope))
## # A tibble: 2 × 2
## UseNonUse slope
## <int> <dbl>
## 1 0 10.3
## 2 1 5.93
summarise(byUse, DistTrails = mean(AllTrails))
## # A tibble: 2 × 2
## UseNonUse DistTrails
## <int> <dbl>
## 1 0 244.
## 2 1 206.
summarise(byUse, DistCover = mean(CoverDist))
## # A tibble: 2 × 2
## UseNonUse DistCover
## <int> <dbl>
## 1 0 3.84
## 2 1 2.07
summarise(byUse, DistRoads = mean(Roads))
## # A tibble: 2 × 2
## UseNonUse DistRoads
## <int> <dbl>
## 1 0 474.
## 2 1 457.
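The four separate `summarise()` calls above can also be collapsed into a single call using dplyr's `across()` helper (available in the dplyr version shown in the load message); this is a sketch of an alternative, not part of the original lab:

```r
library(dplyr)

# One summarise() call producing the group means of all four covariates at once
cougar_df %>%
  group_by(UseNonUse) %>%
  summarise(across(c(Slope, AllTrails, CoverDist, Roads), mean))
```

This yields the same group means as the four tibbles printed above, in one table.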
Let's visualize these differences with overlaid histograms for each variable. The multhist function from the plotrix package does a nice job of this.
install.packages("plotrix")
## package 'plotrix' successfully unpacked and MD5 sums checked
##
## The downloaded binary packages are in
## C:\Users\Administrator.KJLWS11\AppData\Local\Temp\Rtmp84rxPr\downloaded_packages
library(plotrix)
# Use a 2x2 plotting matrix to see all of the histograms at once
par(mfrow = c(2,2))
multhist(list(cougar$AllTrails[cougar$factorUSE==1],cougar$AllTrails[cougar$factorUSE==0]), freq = TRUE, main = "Trails")
# I chose to put a legend in the upper right hand graph.
# That's what the additional arguments in the line below specify.
multhist(list(cougar$CoverDist[cougar$factorUSE==1],cougar$CoverDist[cougar$factorUSE==0]), freq = TRUE, main = "Cover Distance", legend.text = c("Used", "Unused"), args.legend = list(bty = "n"))
multhist(list(cougar$Roads[cougar$factorUSE==1],cougar$Roads[cougar$factorUSE==0]), freq = TRUE, main = "Roads")
multhist(list(cougar$Slope[cougar$factorUSE==1],cougar$Slope[cougar$factorUSE==0]), freq = TRUE, main = "Slope")
Box plots also do a good job of visualizing these differences. These are plotted in a similar fashion as above.
par(mfrow= c(2,2))
boxplot(AllTrails~factorUSE, ylab="Distance (m)", xlab="Used",main = "Trails", data=cougar)
boxplot(CoverDist~factorUSE, ylab="Distance (m)", xlab="Used", main = "Cover", data=cougar)
boxplot(Roads~factorUSE, ylab="Distance (m)", xlab="Used",main = "Roads", data=cougar)
boxplot(Slope~factorUSE, ylab="Slope", xlab="Used", main = "Slope", data=cougar)
Now test for significant differences in each of the covariates between used and unused locations:
t.test(AllTrails~factorUSE, alternative='two.sided', conf.level=.95,
var.equal=FALSE, data=cougar)
##
## Welch Two Sample t-test
##
## data: AllTrails by factorUSE
## t = 2.6896, df = 715.58, p-value = 0.00732
## alternative hypothesis: true difference in means between group 0 and group 1 is not equal to 0
## 95 percent confidence interval:
## 10.13185 64.90396
## sample estimates:
## mean in group 0 mean in group 1
## 243.5399 206.0220
t.test(CoverDist~factorUSE, alternative='two.sided', conf.level=.95,
var.equal=FALSE, data=cougar)
##
## Welch Two Sample t-test
##
## data: CoverDist by factorUSE
## t = 3.3686, df = 604.85, p-value = 0.0008037
## alternative hypothesis: true difference in means between group 0 and group 1 is not equal to 0
## 95 percent confidence interval:
## 0.7374901 2.7997000
## sample estimates:
## mean in group 0 mean in group 1
## 3.842975 2.074380
t.test(Roads~factorUSE, alternative='two.sided', conf.level=.95,
var.equal=FALSE, data=cougar)
##
## Welch Two Sample t-test
##
## data: Roads by factorUSE
## t = 0.86435, df = 679.54, p-value = 0.3877
## alternative hypothesis: true difference in means between group 0 and group 1 is not equal to 0
## 95 percent confidence interval:
## -22.24071 57.22143
## sample estimates:
## mean in group 0 mean in group 1
## 474.2039 456.7135
t.test(Slope~factorUSE, alternative='two.sided', conf.level=.95,
var.equal=FALSE, data=cougar)
##
## Welch Two Sample t-test
##
## data: Slope by factorUSE
## t = 5.4158, df = 627.87, p-value = 8.7e-08
## alternative hypothesis: true difference in means between group 0 and group 1 is not equal to 0
## 95 percent confidence interval:
## 2.774890 5.931965
## sample estimates:
## mean in group 0 mean in group 1
## 10.287312 5.933884
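The four Welch t-tests above can also be run with a short loop instead of four copied calls; a sketch, assuming the column names from cougar.csv:

```r
# Run Welch's two-sample t-test for each covariate against used/unused status
covariates <- c("AllTrails", "CoverDist", "Roads", "Slope")
for (v in covariates) {
  f <- as.formula(paste(v, "~ factorUSE"))  # builds e.g. AllTrails ~ factorUSE
  print(t.test(f, data = cougar, var.equal = FALSE))
}
```

This reproduces the same four tests; building the formula with `paste()` is what lets one loop cover all the covariates.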
First, let's create a data set of only the used locations.
USEonly <- subset(cougar, subset=UseNonUse == 1)
# Construct linear models of track counts as a function of 4 covariates
# Distance to trails model
trails <- glm(Use ~ AllTrails, family=gaussian(identity), data=USEonly)
summary(trails)
##
## Call:
## glm(formula = Use ~ AllTrails, family = gaussian(identity), data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -5.120 -2.106 -0.033 1.054 13.009
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 6.1910009 0.2541896 24.356 <2e-16 ***
## AllTrails -0.0023578 0.0009354 -2.521 0.0121 *
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 9.972573)
##
## Null deviance: 3663.5 on 362 degrees of freedom
## Residual deviance: 3600.1 on 361 degrees of freedom
## AIC: 1869
##
## Number of Fisher Scoring iterations: 2
# Slope model
slope <- glm(Use ~ Slope, family=gaussian(identity), data=USEonly)
summary(slope)
##
## Call:
## glm(formula = Use ~ Slope, family = gaussian(identity), data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -4.7694 -2.6853 0.2811 0.3147 13.3114
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 5.685274 0.204401 27.81 <2e-16 ***
## Slope 0.003364 0.019816 0.17 0.865
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 10.14728)
##
## Null deviance: 3663.5 on 362 degrees of freedom
## Residual deviance: 3663.2 on 361 degrees of freedom
## AIC: 1875.3
##
## Number of Fisher Scoring iterations: 2
# Distance to cover model
cover <- glm(Use ~ CoverDist, family=gaussian(identity), data=USEonly)
summary(cover)
##
## Call:
## glm(formula = Use ~ CoverDist, family = gaussian(identity), data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -4.8123 -1.9863 0.1877 1.0137 13.1877
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 5.81233 0.17904 32.464 <2e-16 ***
## CoverDist -0.05163 0.03162 -1.633 0.103
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 10.07372)
##
## Null deviance: 3663.5 on 362 degrees of freedom
## Residual deviance: 3636.6 on 361 degrees of freedom
## AIC: 1872.6
##
## Number of Fisher Scoring iterations: 2
# Distance to roads model
roads <- glm(Use ~ Roads, family=gaussian(identity), data=USEonly)
summary(roads)
##
## Call:
## glm(formula = Use ~ Roads, family = gaussian(identity), data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -4.8972 -2.5408 0.2199 0.5703 13.2917
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 5.9166147 0.3654078 16.19 <2e-16 ***
## Roads -0.0004628 0.0007115 -0.65 0.516
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 10.13621)
##
## Null deviance: 3663.5 on 362 degrees of freedom
## Residual deviance: 3659.2 on 361 degrees of freedom
## AIC: 1874.9
##
## Number of Fisher Scoring iterations: 2
But these tests require the assumption of normality for the response variable, Use. Is that reasonable? Let's test the track counts for normality.
# Visualize with a histogram
par(mfrow= c(1,1))
hist(USEonly$Use, freq=TRUE, breaks="Sturges", col="darkgray")
## not really that normal
shapiro.test(USEonly$Use)
##
## Shapiro-Wilk normality test
##
## data: USEonly$Use
## W = 0.92147, p-value = 7.473e-13
Clearly, the answer is NO. This means we need to consider OTHER forms of generalized linear models, or a transformation of the response variable, here Use.
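One of those other GLM forms, the Poisson family, is a natural first choice for count data like track counts; a minimal sketch (this model is not fit in the original lab):

```r
# Poisson regression of track counts on distance to trails,
# fit to the full data set; the log link keeps predicted counts positive
pois.trails <- glm(Use ~ AllTrails, family = poisson(link = "log"), data = cougar)
summary(pois.trails)
```

Unlike the log transform used next, the Poisson family handles zero counts directly, which is why it can be fit to the complete data rather than only the used subset.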
Let's re-fit the linear models with a natural-log transform: first create the log-transformed variable, then refit the linear models with the transformed use. (This works here only because the used subset contains no zero counts; log(0) is undefined.)
USEonly$lnUSE <- with(USEonly, log(Use))
hist(USEonly$lnUSE) ## a bit more normal
### Now re-fit the models
# Distance to trails model
ln.trails <- glm(lnUSE ~ AllTrails, family=gaussian(identity), data=USEonly)
summary(ln.trails)
##
## Call:
## glm(formula = lnUSE ~ AllTrails, family = gaussian(identity),
## data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.6225 -0.2778 0.1812 0.3611 1.3439
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.6345317 0.0541846 30.166 <2e-16 ***
## AllTrails -0.0003999 0.0001994 -2.006 0.0456 *
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 0.4531516)
##
## Null deviance: 165.41 on 362 degrees of freedom
## Residual deviance: 163.59 on 361 degrees of freedom
## AIC: 746.82
##
## Number of Fisher Scoring iterations: 2
# Slope model
ln.slope <- glm(lnUSE ~ Slope, family=gaussian(identity), data=USEonly)
summary(ln.slope)
##
## Call:
## glm(formula = lnUSE ~ Slope, family = gaussian(identity), data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.5607 -0.4340 0.2311 0.2707 1.3852
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.56068 0.04343 35.938 <2e-16 ***
## Slope -0.00144 0.00421 -0.342 0.732
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 0.4580528)
##
## Null deviance: 165.41 on 362 degrees of freedom
## Residual deviance: 165.36 on 361 degrees of freedom
## AIC: 750.72
##
## Number of Fisher Scoring iterations: 2
# Distance to cover model
ln.cover <- glm(lnUSE ~ CoverDist, family=gaussian(identity), data=USEonly)
summary(ln.cover)
##
## Call:
## glm(formula = lnUSE ~ CoverDist, family = gaussian(identity),
## data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.5792 -0.2721 0.2126 0.3667 1.4028
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.57917 0.03799 41.573 <2e-16 ***
## CoverDist -0.01303 0.00671 -1.942 0.0529 .
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 0.4534625)
##
## Null deviance: 165.41 on 362 degrees of freedom
## Residual deviance: 163.70 on 361 degrees of freedom
## AIC: 747.07
##
## Number of Fisher Scoring iterations: 2
# Distance to roads model
ln.roads <- glm(lnUSE ~ Roads, family=gaussian(identity), data=USEonly)
summary(ln.roads)
##
## Call:
## glm(formula = lnUSE ~ Roads, family = gaussian(identity), data = USEonly)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.5737 -0.4394 0.2330 0.2563 1.3926
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 1.535e+00 7.768e-02 19.76 <2e-16 ***
## Roads 3.781e-05 1.513e-04 0.25 0.803
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for gaussian family taken to be 0.4581219)
##
## Null deviance: 165.41 on 362 degrees of freedom
## Residual deviance: 165.38 on 361 degrees of freedom
## AIC: 750.78
##
## Number of Fisher Scoring iterations: 2
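Since all four log-transformed models were fit to the same response, the AIC values reported in each summary above can be compared directly in one table:

```r
# Rank the single-covariate models by AIC; lower is better
AIC(ln.trails, ln.slope, ln.cover, ln.roads)
```

From the summaries above, ln.trails has the lowest AIC (746.8), making distance to trails the best-supported single covariate among these four.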
Next, let's switch back to the complete data set, re-run the Shapiro-Wilk test, and make another histogram for all of the data.
shapiro.test(cougar$Use)
##
## Shapiro-Wilk normality test
##
## data: cougar$Use
## W = 0.78045, p-value < 2.2e-16
hist(cougar$Use, freq=TRUE, breaks="Sturges", col="darkgray")
Clearly these data are still not normal, so it might be useful to categorize them in a binary fashion as USED (1) or NOT USED (0) and use logistic regression. Thus, next we will use logistic regression to model the log-odds of use for each of the 4 covariates.
# Trails model
logitTrails <- glm(UseNonUse ~ AllTrails, family=binomial(logit), data=cougar)
summary(logitTrails)
##
## Call:
## glm(formula = UseNonUse ~ AllTrails, family = binomial(logit),
## data = cougar)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.2805 -1.1920 0.0936 1.1450 1.4275
##
## Coefficients:
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) 0.2392204 0.1163257 2.056 0.03974 *
## AllTrails -0.0010675 0.0004007 -2.664 0.00772 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for binomial family taken to be 1)
##
## Null deviance: 1006.45 on 725 degrees of freedom
## Residual deviance: 999.22 on 724 degrees of freedom
## AIC: 1003.2
##
## Number of Fisher Scoring iterations: 4
# Slope model
logitSlope <- glm(UseNonUse ~ Slope, family=binomial(logit), data=cougar)
summary(logitSlope)
##
## Call:
## glm(formula = UseNonUse ~ Slope, family = binomial(logit), data = cougar)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.3078 -1.2388 0.2123 1.0679 1.8429
##
## Coefficients:
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) 0.301367 0.093849 3.211 0.00132 **
## Slope -0.038241 0.007427 -5.149 2.62e-07 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for binomial family taken to be 1)
##
## Null deviance: 1006.45 on 725 degrees of freedom
## Residual deviance: 977.23 on 724 degrees of freedom
## AIC: 981.23
##
## Number of Fisher Scoring iterations: 4
# Cover model
logitCover <- glm(UseNonUse~ CoverDist, family=binomial(logit), data=cougar)
summary(logitCover)
##
## Call:
## glm(formula = UseNonUse ~ CoverDist, family = binomial(logit),
## data = cougar)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.2233 -1.2233 0.3021 1.1322 1.8036
##
## Coefficients:
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) 0.10739 0.08089 1.328 0.18428
## CoverDist -0.03788 0.01175 -3.224 0.00126 **
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## (Dispersion parameter for binomial family taken to be 1)
##
## Null deviance: 1006.45 on 725 degrees of freedom
## Residual deviance: 994.78 on 724 degrees of freedom
## AIC: 998.78
##
## Number of Fisher Scoring iterations: 4
# Roads model
logitRoads <- glm(UseNonUse ~ Roads, family=binomial(logit), data=cougar)
summary(logitRoads)
##
## Call:
## glm(formula = UseNonUse ~ Roads, family = binomial(logit), data = cougar)
##
## Deviance Residuals:
## Min 1Q Median 3Q Max
## -1.22439 -1.18003 0.02292 1.17627 1.24756
##
## Coefficients:
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) 0.1098329 0.1471202 0.747 0.455
## Roads -0.0002360 0.0002729 -0.865 0.387
##
## (Dispersion parameter for binomial family taken to be 1)
##
## Null deviance: 1006.4 on 725 degrees of freedom
## Residual deviance: 1005.7 on 724 degrees of freedom
## AIC: 1009.7
##
## Number of Fisher Scoring iterations: 3
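Logistic-regression coefficients are on the log-odds scale; exponentiating converts them to odds ratios, shown here for the slope model above:

```r
# Odds ratio per unit increase in each predictor, with 95% confidence intervals
exp(coef(logitSlope))
exp(confint(logitSlope))
```

From the summary above, the Slope coefficient is -0.038, so exp(-0.038) is roughly 0.96: the odds of cougar use fall by about 4% for each additional degree of slope.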
install.packages("ggplot2")
## Warning: package 'ggplot2' is in use and will not be installed
library(ggplot2)
ggplot(cougar, aes(x=Slope, y=UseNonUse)) + geom_rug() + stat_smooth(method="glm", method.args=list(family="binomial"))
## `geom_smooth()` using formula = 'y ~ x'
ggplot(cougar, aes(x=CoverDist, y=UseNonUse)) + geom_rug() + stat_smooth(method="glm", method.args=list(family="binomial"))
## `geom_smooth()` using formula = 'y ~ x'
So, to review: we have covered some basic data visualization (boxplots, histograms, etc.), basic summary statistics, basic statistical analyses (t-tests, tests for normality, generalized linear models), and considered a simple dataset with different ways of thinking about animal use of space and 4 spatial covariates.
But how did we build our dataset of covariates like distance to cover and distance to trails? Next, we will move to understanding and exploring spatial data in R.
This semester we will use a nifty little wrapper function that Dan Eacker adapted from https://gist.github.com/stevenworthington/3178163

# Define function to install and load required packages
ipak <- function(pkg){
new.pkg <- pkg[!(pkg %in% installed.packages()[, "Package"])]
if (length(new.pkg))
install.packages(new.pkg, dependencies = TRUE)
sapply(pkg, require, character.only = TRUE)
}
#load or install these packages:
packages <- c("ks", "lattice", "adehabitatHR", "maptools", "foreign", "rgdal", "sp", "raster","plot3D","rasterVis", "colorRamps","rgeos")
#run function to install packages - e.g., library command
ipak(packages)
## Loading required package: ks
## Loading required package: lattice
## Loading required package: adehabitatHR
## Loading required package: sp
## Loading required package: deldir
## deldir 1.0-6 Nickname: "Mendacious Cosmonaut"
##
## The syntax of deldir() has had an important change.
## The arguments have been re-ordered (the first three
## are now "x, y, z") and some arguments have been
## eliminated. The handling of the z ("tags")
## argument has been improved.
##
## The "dummy points" facility has been removed.
## This facility was a historical artefact, was really
## of no use to anyone, and had hung around much too
## long. Since there are no longer any "dummy points",
## the structure of the value returned by deldir() has
## changed slightly. The arguments of plot.deldir()
## have been adjusted accordingly; e.g. the character
## string "wpoints" ("which points") has been
## replaced by the logical scalar "showpoints".
## The user should consult the help files.
## Loading required package: ade4
## Loading required package: adehabitatMA
## Registered S3 methods overwritten by 'adehabitatMA':
## method from
## print.SpatialPixelsDataFrame sp
## print.SpatialPixels sp
## Loading required package: adehabitatLT
## Loading required package: CircStats
## Loading required package: MASS
##
## Attaching package: 'MASS'
## The following object is masked from 'package:dplyr':
##
## select
## Loading required package: boot
##
## Attaching package: 'boot'
## The following object is masked from 'package:lattice':
##
## melanoma
##
## Attaching package: 'adehabitatLT'
## The following object is masked from 'package:dplyr':
##
## id
## Loading required package: maptools
## Checking rgeos availability: TRUE
## Please note that 'maptools' will be retired during 2023,
## plan transition at your earliest convenience;
## some functionality will be moved to 'sp'.
## Loading required package: foreign
## Loading required package: rgdal
## Please note that rgdal will be retired during 2023,
## plan transition to sf/stars/terra functions using GDAL and PROJ
## at your earliest convenience.
## See https://r-spatial.org/r/2022/04/12/evolution.html and https://github.com/r-spatial/evolution
## rgdal: version: 1.6-3, (SVN revision 1187)
## Geospatial Data Abstraction Library extensions to R successfully loaded
## Loaded GDAL runtime: GDAL 3.5.2, released 2022/09/02
## Path to GDAL shared files: C:/Program Files/R/R-4.2.2/library/rgdal/gdal
## GDAL binary built with GEOS: TRUE
## Loaded PROJ runtime: Rel. 8.2.1, January 1st, 2022, [PJ_VERSION: 821]
## Path to PROJ shared files: C:/Program Files/R/R-4.2.2/library/rgdal/proj
## PROJ CDN enabled: FALSE
## Linking to sp version:1.5-1
## To mute warnings of possible GDAL/OSR exportToProj4() degradation,
## use options("rgdal_show_exportToProj4_warnings"="none") before loading sp or rgdal.
## Loading required package: raster
##
## Attaching package: 'raster'
## The following object is masked from 'package:MASS':
##
## select
## The following object is masked from 'package:dplyr':
##
## select
## Loading required package: plot3D
## Loading required package: rasterVis
## Loading required package: colorRamps
## Loading required package: rgeos
## rgeos version: 0.6-1, (SVN revision 692)
## GEOS runtime version: 3.9.3-CAPI-1.14.3
## Please note that rgeos will be retired during 2023,
## plan transition to sf functions using GEOS at your earliest convenience.
## GEOS using OverlayNG
## Linking to sp version: 1.5-1
## Polygon checking: TRUE
## ks lattice adehabitatHR maptools foreign rgdal
## TRUE TRUE TRUE TRUE TRUE TRUE
## sp raster plot3D rasterVis colorRamps rgeos
## TRUE TRUE TRUE TRUE TRUE TRUE
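Note the retirement warnings printed above for rgdal, rgeos, and maptools. For reference, the shapefile reads used in this lab could equivalently be done with the newer sf and terra packages; a sketch, assuming those packages are installed:

```r
library(sf)     # vector data; successor to much of sp/rgdal/rgeos/maptools
library(terra)  # raster and vector data; successor to raster

# sf reads a shapefile into a data-frame-like sf object
elc_habitat_sf <- st_read("Data/elc_habitat.shp")

# terra's vect() is the analogue of raster::shapefile()
wolfyht_v <- vect("Data/wolfyht.shp")

plot(st_geometry(elc_habitat_sf))
```

We stay with sp and raster below to match the rest of the lab, but the sf/terra workflow is where the R spatial ecosystem is headed.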
First set the working directory, then read in the shapefiles.
# reading in shapefiles (raster package)
elc_habitat<-shapefile("Data/elc_habitat.shp")
humanaccess<-shapefile("Data/humanacess.shp")
mcp2<-shapefile("Data/mcp2.shp")
wolfyht<-shapefile("Data/wolfyht.shp")
# make a very basic plot of shapefile after resetting graphical parameters
par(mfrow= c(1,1))
plot(elc_habitat)
plot(wolfyht)
# look at the class of the shapefile
class(elc_habitat)
## [1] "SpatialPolygonsDataFrame"
## attr(,"package")
## [1] "sp"
# look at structure of shapefile
#str(elc_habitat)
# look at first 20 rows of data for shapefile
head(elc_habitat@data, n=20)
## ID ECO_NUM CODE ECOREGION ECOSECTION SLOPE COMPLEX MODIFIER_1
## 1 1 154 WF3 Upper Subalpine Wildflower 9 <NA> B
## 2 2 101 PL1 Upper Subalpine Peyto Lake 8 <NA> <NA>
## 3 3 152 WF1 Upper Subalpine Wildflower 9 <NA> A
## 4 4 125 SB2 Lower Subalpine Sawback 8 <NA> A
## 5 5 153 WF2 Upper Subalpine Wildflower 8 <NA> <NA>
## 6 6 110 PP6 Lower Subalpine Pipestone 7 <NA> B
## 7 7 127 SB4 Lower Subalpine Sawback 9 <NA> B
## 8 8 1 AL1 Lower Subalpine Altrude 3 <NA> B
## 9 9 152 WF1 Upper Subalpine Wildflower 8 <NA> A
## 10 10 70 HC4 Lower Subalpine Hector 3 c <NA>
## 11 11 108 PP3 Lower Subalpine Pipestone 7 <NA> <NA>
## 12 12 154 WF3 Upper Subalpine Wildflower 9 <NA> A
## 13 13 180 ZZ rock_water Water 0 <NA> <NA>
## 14 14 152 WF1 Upper Subalpine Wildflower 9 <NA> X
## 15 15 112 PR1 Lower Subalpine Panarama 7 c B
## 16 16 179 T Alpine Rock 0 <NA> <NA>
## 17 17 9 BK1 Lower Subalpine Baker_cr 6 c <NA>
## 18 18 153 WF2 Upper Subalpine Wildflower 8 <NA> <NA>
## 19 19 181 R+CR rock_water Rock 0 <NA> <NA>
## 20 20 180 ZZ rock_water Water 0 <NA> <NA>
## MODIFIER_2 DESCRIPTIO DEFINITION DESCRIPT0 EREG_NUM ESEC_NUM
## 1 <NA> BURNED > 70% <NA> 3 55
## 2 <NA> <NA> 45 - 70% <NA> 3 39
## 3 B SNOW AVALANCHED > 70% BURNED 3 55
## 4 <NA> SNOW AVALANCHED 45 - 70% <NA> 2 45
## 5 <NA> <NA> 45 - 70% <NA> 3 55
## 6 <NA> BURNED 30 - 45% <NA> 2 40
## 7 <NA> BURNED > 70% <NA> 2 45
## 8 <NA> BURNED 0 - 5% <NA> 2 1
## 9 <NA> SNOW AVALANCHED 45 - 70% <NA> 3 55
## 10 <NA> <NA> 0 - 5% <NA> 2 24
## 11 <NA> <NA> 30 - 45% <NA> 2 40
## 12 <NA> SNOW AVALANCHED > 70% <NA> 3 55
## 13 <NA> <NA> <NA> <NA> 7 59
## 14 <NA> LITHIC > 70% <NA> 3 55
## 15 <NA> BURNED 30 - 45% <NA> 2 41
## 16 <NA> <NA> <NA> <NA> 6 61
## 17 <NA> <NA> 15 - 30% <NA> 2 4
## 18 <NA> <NA> 45 - 70% <NA> 3 55
## 19 <NA> <NA> <NA> <NA> 7 61
## 20 <NA> <NA> <NA> <NA> 7 59
## DOMVEG DVEG_NUM VEGTYPE VTYP_NUM DOMSOIL
## 1 Open_Mixed_conifer 16 O4 53 Brunisol
## 2 Englemann_spruce/Subalpine_fir 3 C15 14 Brunisol
## 3 Englemann_spruce/Subalpine_fir 3 C15 14 Brunisol
## 4 Englemann_spruce/Subalpine_fir 3 C13 12 Brunisol
## 5 Salix_shrubland 22 S2 69 Brunisol
## 6 Open_Englemann_spruce/Subalpine_fir 4 O18 65 Regosolic
## 7 Lodgepole_pine 1 O4 53 Brunisol
## 8 Lodgepole_pine 1 C19 18 Non-soil
## 9 Englemann_spruce/Subalpine_fir 3 C15 14 Brunisol
## 10 Dwarf_birch_shrubland 5 S1 68 Gleysolic
## 11 Englemann_spruce/Subalpine_fir 3 C13 12 Regosolic
## 12 Open_Mixed_conifer 16 O4 53 Brunisol
## 13 Non_vegetated 29 NIL 111 Non-soil
## 14 Englemann_spruce/Subalpine_fir 3 C15 14 Brunisol
## 15 Lodgepole_pine 1 C20 19 Brunisol
## 16 Non_vegetated 29 NIL 111 Non-soil
## 17 Lodgepole_pine 1 C18 17 Brunisol
## 18 Salix_shrubland 22 S2 69 Brunisol
## 19 Non_vegetated 29 NIL 111 Non-soil
## 20 Non_vegetated 29 NIL 111 Non-soil
## DSOI_NUM CLS DCLS_NUM LANDFORM LFOR_NUM DEERHAB_W
## 1 1 <NA> 0 Colluvial 2 0
## 2 1 Moderately 4 Till 12 3
## 3 1 <NA> 0 Colluvial 2 0
## 4 1 <NA> 0 Colluvial 2 0
## 5 1 <NA> 0 Colluvial 2 0
## 6 6 Well 3 Fluvial 4 3
## 7 1 <NA> 0 Colluvial 2 0
## 8 1 Well 3 Fluvial 4 3
## 9 1 <NA> 0 Colluvial 2 0
## 10 5 Poorly 6 Fluvial 4 3
## 11 6 Well 3 Fluvial 4 3
## 12 1 <NA> 0 Colluvial 2 0
## 13 8 <NA> 0 Water 0 0
## 14 1 <NA> 0 Colluvial 2 0
## 15 1 Well 3 Till 12 4
## 16 8 <NA> 0 Talus 2 0
## 17 1 Well 3 morainal 8 3
## 18 1 <NA> 0 Colluvial 2 0
## 19 8 <NA> 0 Rock and Colluvial Rubble 13 0
## 20 8 <NA> 0 Water 0 0
## DEERHAB_S E00_CODE CODE0 VEGROUP CARI_W CARI_S DEER_W DEER_S MOOSE_W
## 1 0 011248WF3B/9 WF3B <NA> 1 1 3 1 3
## 2 2 011114PL1/8 PL1 <NA> 5 3 4 3 4
## 3 0 011223WF1AB/9 WF1AB <NA> 1 1 1 1 4
## 4 0 011198SB2A/8 SB2A <NA> 1 1 3 1 4
## 5 0 011230WF2/8 WF2 <NA> 3 5 4 3 3
## 6 2 011164PP6B/7 PP6B <NA> 4 4 4 3 5
## 7 0 011209SB4B/9 SB4B <NA> 1 1 4 1 3
## 8 4 011008AL1B/3 AL1 <NA> 1 1 4 5 4
## 9 0 011221WF1A/8 WF1A <NA> 1 1 1 1 4
## 10 3 011085HC4/3c HC4 <NA> 5 3 4 4 5
## 11 2 011160PP3/7 PP3 <NA> 3 1 4 3 5
## 12 0 011242WF3A/9 WF3A <NA> 1 1 3 1 3
## 13 0 011260ZZ ZZ <NA> 7 7 7 7 7
## 14 0 011226WF1X/9 WF1X <NA> 1 1 1 1 4
## 15 2 011172PR1B/7c PR1B <NA> 1 1 5 3 4
## 16 0 011214T T <NA> 1 1 1 1 1
## 17 4 011017BK1/6c BK1 <NA> 1 1 4 5 5
## 18 0 011230WF2/8 WF2 <NA> 3 5 4 3 3
## 19 0 011188R+CR R+CR <NA> 1 1 1 1 1
## 20 0 011260ZZ ZZ <NA> 7 7 7 7 7
## MOOSE_S ELK_W ELK_S GOAT_W GOAT_S SHEEP_W SHEEP_S WOLF_W WOLF_S COYOTE_W
## 1 4 3 3 6 5 5 5 3 3 3
## 2 4 4 4 3 3 4 4 4 4 4
## 3 4 1 4 4 1 3 3 3 4 3
## 4 4 4 4 4 1 3 4 4 3 4
## 5 4 3 3 6 5 5 5 4 4 4
## 6 4 5 5 1 3 3 3 5 5 5
## 7 1 4 3 5 4 5 4 4 3 4
## 8 3 4 5 3 1 1 1 4 5 4
## 9 4 1 4 4 1 3 3 3 4 3
## 10 5 5 5 3 3 3 3 5 5 5
## 11 4 4 4 3 3 3 3 4 4 4
## 12 4 3 3 6 5 5 5 3 3 3
## 13 7 7 7 7 7 7 7 7 7 7
## 14 4 1 4 4 1 3 3 3 4 3
## 15 3 3 3 3 1 3 1 5 3 4
## 16 1 1 1 3 3 1 1 1 1 1
## 17 5 4 5 1 1 3 3 4 5 5
## 18 4 3 3 6 5 5 5 4 4 4
## 19 1 1 1 4 3 1 1 1 1 1
## 20 7 7 7 7 7 7 7 7 7 7
## COYOTE_S MARTEN FISHER WEASEL MINK WOLVERINE COUGAR_W COUGAR_S LYNX PIKA
## 1 3 1 1 1 1 1 3 3 3 4
## 2 4 5 1 5 1 4 3 4 3 5
## 3 4 5 1 1 1 1 1 3 1 4
## 4 3 3 1 1 1 1 4 4 4 4
## 5 4 3 1 3 1 5 3 4 1 5
## 6 5 4 1 5 1 4 5 4 5 1
## 7 3 1 1 1 1 1 5 3 4 5
## 8 5 4 1 4 1 5 4 5 5 1
## 9 4 5 1 1 1 1 1 3 1 4
## 10 6 4 2 5 2 3 5 5 4 3
## 11 4 4 1 4 1 4 4 4 5 3
## 12 3 1 1 1 1 1 3 3 3 4
## 13 7 7 7 7 7 7 7 7 7 7
## 14 4 5 1 1 1 1 1 3 1 4
## 15 4 4 2 3 2 5 4 3 4 4
## 16 1 1 1 4 1 1 1 1 1 1
## 17 6 5 1 3 2 5 4 5 4 4
## 18 4 3 1 3 1 5 3 4 1 5
## 19 1 1 1 1 1 1 1 1 1 1
## 20 7 7 7 7 7 7 7 7 7 7
## HARE BEAVER PORCUPIN MARMOT CSQUIRR REDSQUIR SM_DEN BB_DENSITY BLKBR_BUFB
## 1 4 1 1 1 1 1 4 3 1
## 2 4 1 4 3 5 3 3 3 4
## 3 3 1 4 1 3 3 4 3 1
## 4 4 1 1 1 3 4 4 4 4
## 5 3 1 1 4 5 3 3 3 1
## 6 5 1 3 1 4 4 5 3 5
## 7 4 1 3 1 4 3 4 4 4
## 8 5 1 4 1 5 4 3 4 5
## 9 3 1 4 1 3 3 4 3 1
## 10 4 5 5 1 4 3 4 5 4
## 11 5 4 5 1 4 4 5 4 6
## 12 4 1 1 1 1 1 4 3 1
## 13 7 7 7 7 7 7 7 7 1
## 14 3 1 4 1 3 3 4 3 1
## 15 4 1 4 1 5 4 3 5 1
## 16 1 1 1 1 1 1 5 3 3
## 17 4 4 1 1 3 5 5 5 4
## 18 3 1 1 4 5 3 3 3 1
## 19 1 1 1 1 1 1 1 3 1
## 20 7 7 7 7 7 7 7 7 1
## G_BEAR_A
## 1 3
## 2 3
## 3 3
## 4 4
## 5 3
## 6 4
## 7 5
## 8 4
## 9 3
## 10 3
## 11 4
## 12 3
## 13 7
## 14 3
## 15 3
## 16 0
## 17 4
## 18 3
## 19 0
## 20 7
# look at the projection of the shapefile (note the use of "@" instead of "$")
elc_habitat@proj4string@projargs
## [1] "+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs"
wolfyht@proj4string@projargs
## [1] "+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs"
Sometimes we will need to change the projection of the shapefile (package rgdal). To change the projection we need the appropriate EPSG (Geodetic Parameter Dataset) code, or an equivalent PROJ string, that represents the new projection.
Find a list of these spatial reference codes here: http://spatialreference.org/ref/epsg/
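For example, EPSG:4326 is the code for geographic WGS84 long/lat. A minimal sketch of transforming by EPSG code (the points and coordinates here are made up for illustration; the UTM CRS string matches the one used by elc_habitat above):

```r
library(sp)

# a tiny stand-in layer in the UTM zone 11 / NAD83 CRS used by elc_habitat
pts <- SpatialPoints(cbind(c(560000, 600000), c(5660000, 5740000)),
                     proj4string = CRS("+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs"))

# transform to geographic WGS84 by EPSG code instead of a full PROJ string
pts_ll <- spTransform(pts, CRS("+init=epsg:4326"))
proj4string(pts_ll)
```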
elc_habitat <- spTransform(elc_habitat, CRS("+proj=longlat +datum=NAD83 +no_defs +ellps=GRS80 +towgs84=0,0,0"))
## Warning: PROJ support is provided by the sf and terra packages among others
# check new projection in geographic coordinate system WGS84
elc_habitat@proj4string@projargs
## [1] "+proj=longlat +datum=NAD83 +no_defs"
# note: this re-applies the same long/lat projection, so the CRS is unchanged
elc_habitat <- spTransform(elc_habitat, CRS("+proj=longlat +datum=NAD83 +no_defs +ellps=GRS80 +towgs84=0,0,0"))
# look at the projection of the shapefile
elc_habitat@proj4string@projargs
## [1] "+proj=longlat +datum=NAD83 +no_defs"
# another way to change it back to previous
elc_habitat <- spTransform(elc_habitat, CRS("+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs +ellps=GRS80 +towgs84=0,0,0"))
# write the shapefile back to disk
writeOGR(elc_habitat,".","elc_habitat_NEW",driver="ESRI Shapefile", overwrite_layer=TRUE)
## Warning: OGR support is provided by the sf and terra packages among others
## Warning: OGR support is provided by the sf and terra packages among others
Spatial data objects occupy a spatial extent in geographic space. To look at the spatial extent of the shapefile, we use extent() (see ?extent for details).
extent(elc_habitat)
## class : Extent
## xmin : 443680.6
## xmax : 627727.9
## ymin : 5618405
## ymax : 5789236
extent(wolfyht)
## class : Extent
## xmin : 555853
## xmax : 605389
## ymin : 5656997
## ymax : 5741316
We will learn later how to harmonize datasets of different spatial extents.
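As a preview, one common way to harmonize extents is to crop the larger layer to the smaller one's extent. A sketch with made-up in-memory rasters (not the lab data):

```r
library(raster)

# two toy rasters with the same resolution but different extents
r1 <- raster(xmn = 0, xmx = 100, ymn = 0, ymx = 100, res = 10, vals = 1)
r2 <- raster(xmn = 20, xmx = 80, ymn = 20, ymx = 80, res = 10, vals = 2)

# crop the larger raster to the smaller one's extent so the two line up
r1_cropped <- crop(r1, extent(r2))
all(as.vector(extent(r1_cropped)) == as.vector(extent(r2)))
# if the resolutions also differed, resample() would be needed as well
```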
par(mfrow= c(1,1)) ## reset graphical parameters
# reading in raster files (raster package)
deer_w<-raster("Data/deer_w2.tif")
moose_w<-raster("Data/moose_w2.tif") ## missing moose
elk_w<-raster("Data/elk_w2.tif")
sheep_w<-raster("Data/sheep_w2.tif") ## missing sheep
goat_w<-raster("Data/goat_w2.tif")
wolf_w<-raster("Data/wolf_w2.tif")
elevation2<-raster("Data/Elevation2.tif") #resampled
disthumanaccess2<-raster("Data/DistFromHumanAccess2.tif") #resampled
# make a very basic plot of raster
plot(deer_w)
# look at the class of the raster
class(deer_w)
## [1] "RasterLayer"
## attr(,"package")
## [1] "raster"
# look at basic raster summary
deer_w
## class : RasterLayer
## dimensions : 5694, 6892, 39243048 (nrow, ncol, ncell)
## resolution : 30, 30 (x, y)
## extent : 443680.6, 650440.6, 5618416, 5789236 (xmin, xmax, ymin, ymax)
## crs : +proj=utm +zone=11 +datum=NAD83 +units=m +no_defs
## source : deer_w2.tif
## names : deer_w2
## values : 0, 7 (min, max)
# look at raster data
deer_w@data
## An object of class ".SingleLayerData"
## Slot "values":
## logical(0)
##
## Slot "offset":
## [1] 0
##
## Slot "gain":
## [1] 1
##
## Slot "inmemory":
## [1] FALSE
##
## Slot "fromdisk":
## [1] TRUE
##
## Slot "isfactor":
## [1] FALSE
##
## Slot "attributes":
## list()
##
## Slot "haveminmax":
## [1] TRUE
##
## Slot "min":
## [1] 0
##
## Slot "max":
## [1] 7
##
## Slot "band":
## [1] 1
##
## Slot "unit":
## [1] ""
##
## Slot "names":
## [1] "deer_w2"
# look at structure of raster
str(deer_w)
## Formal class 'RasterLayer' [package "raster"] with 12 slots
## ..@ file :Formal class '.RasterFile' [package "raster"] with 13 slots
## .. .. ..@ name : chr "C:\\Users\\Administrator.KJLWS11\\Documents\\Lab1\\Data\\deer_w2.tif"
## .. .. ..@ datanotation: chr "FLT8S"
## .. .. ..@ byteorder : chr "little"
## .. .. ..@ nodatavalue : num -Inf
## .. .. ..@ NAchanged : logi FALSE
## .. .. ..@ nbands : int 1
## .. .. ..@ bandorder : chr "BIL"
## .. .. ..@ offset : int 0
## .. .. ..@ toptobottom : logi TRUE
## .. .. ..@ blockrows : Named int 1
## .. .. .. ..- attr(*, "names")= chr "rows"
## .. .. ..@ blockcols : Named int 6892
## .. .. .. ..- attr(*, "names")= chr "cols"
## .. .. ..@ driver : chr "gdal"
## .. .. ..@ open : logi FALSE
## ..@ data :Formal class '.SingleLayerData' [package "raster"] with 13 slots
## .. .. ..@ values : logi(0)
## .. .. ..@ offset : num 0
## .. .. ..@ gain : num 1
## .. .. ..@ inmemory : logi FALSE
## .. .. ..@ fromdisk : logi TRUE
## .. .. ..@ isfactor : logi FALSE
## .. .. ..@ attributes: list()
## .. .. ..@ haveminmax: logi TRUE
## .. .. ..@ min : num 0
## .. .. ..@ max : num 7
## .. .. ..@ band : int 1
## .. .. ..@ unit : chr ""
## .. .. ..@ names : chr "deer_w2"
## ..@ legend :Formal class '.RasterLegend' [package "raster"] with 5 slots
## .. .. ..@ type : chr(0)
## .. .. ..@ values : logi(0)
## .. .. ..@ color : logi(0)
## .. .. ..@ names : logi(0)
## .. .. ..@ colortable: logi(0)
## ..@ title : chr(0)
## ..@ extent :Formal class 'Extent' [package "raster"] with 4 slots
## .. .. ..@ xmin: num 443681
## .. .. ..@ xmax: num 650441
## .. .. ..@ ymin: num 5618416
## .. .. ..@ ymax: num 5789236
## ..@ rotated : logi FALSE
## ..@ rotation:Formal class '.Rotation' [package "raster"] with 2 slots
## .. .. ..@ geotrans: num(0)
## .. .. ..@ transfun:function ()
## ..@ ncols : int 6892
## ..@ nrows : int 5694
## ..@ crs :Formal class 'CRS' [package "sp"] with 1 slot
## .. .. ..@ projargs: chr "+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs"
## .. .. ..$ comment: chr "PROJCRS[\"unknown\",\n BASEGEOGCRS[\"unknown\",\n DATUM[\"North American Datum 1983\",\n E"| __truncated__
## ..@ history : list()
## ..@ z : list()
# look at the projection of the raster (note the use of "@" instead of "$")
deer_w@crs@projargs
## [1] "+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs"
# look at the spatial extent of the raster
extent(deer_w)
## class : Extent
## xmin : 443680.6
## xmax : 650440.6
## ymin : 5618416
## ymax : 5789236
# look at the resolution of the raster
res(deer_w)
## [1] 30 30
If we need to change the projection of the raster to another projection (package rgdal), we use projectRaster(). Note that for this tutorial the calls below are commented out because they take a LONG time to run on rasters of this size.
#deer_w2 <- projectRaster(deer_w, crs="+init=epsg:4326")
## or try this the same way we did above
#deer_w2 <- projectRaster(deer_w, crs="+proj=longlat +datum=NAD83 +no_defs +ellps=GRS80 +towgs84=0,0,0")
#check projection of the raster
deer_w@crs@projargs
## [1] "+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs"
# change it back to what it was, using another raster layer's projection
#deer_w <- projectRaster(deer_w2, wolf_w)
#check projection of the raster
deer_w@crs@projargs
## [1] "+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs"
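To see projectRaster() in action without the long wait, a small in-memory raster reprojects in a fraction of a second (toy values; the CRS strings are the same ones used above):

```r
library(raster)

# a 10 x 10 toy raster in the UTM zone 11 / NAD83 CRS used throughout this lab
r <- raster(xmn = 550000, xmx = 560000, ymn = 5650000, ymx = 5660000,
            res = 1000,
            crs = "+proj=utm +zone=11 +datum=NAD83 +units=m +no_defs",
            vals = 1:100)

# reproject to geographic coordinates; fast because the raster is tiny
r_ll <- projectRaster(r, crs = "+proj=longlat +datum=NAD83 +no_defs")
projection(r_ll)
```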
One of the GREAT things about doing GIS in R is the ability to combine rasters that share the same projection, extent, and resolution into a single raster stack. Let's create one!
all.rasters<-stack(deer_w, moose_w, elk_w, sheep_w, goat_w, elevation2, disthumanaccess2)
plot(all.rasters)
#check class
class(all.rasters)
## [1] "RasterStack"
## attr(,"package")
## [1] "raster"
We may also need to export rasters to other file formats, such as GeoTIFF.
# learn more with ?writeRaster
writeRaster(deer_w, "Output/new_deer.tiff", "GTiff", overwrite=TRUE)
Learn about mapview here: https://r-spatial.github.io/mapview/articles/articles/mapview_02-advanced.html
And also learn about the different map types here: http://leaflet-extras.github.io/leaflet-providers/preview/
Note: I had unexplained errors with the latest version of mapview that caused problems both when viewing the wolf telemetry points and when rendering the .html through R Markdown. I found the solution here: https://github.com/r-spatial/mapview/issues/312
install.packages("remotes")
## package 'remotes' successfully unpacked and MD5 sums checked
##
## The downloaded binary packages are in
## C:\Users\Administrator.KJLWS11\AppData\Local\Temp\Rtmp84rxPr\downloaded_packages
library(remotes)
remotes::install_github("r-spatial/mapview")
## Downloading GitHub repo r-spatial/mapview@HEAD
## cli (3.5.0 -> 3.6.0 ) [CRAN]
## raster (3.6-11 -> 3.6-13) [CRAN]
## htmlwidgets (1.6.0 -> 1.6.1 ) [CRAN]
## svglite (2.1.0 -> 2.1.1 ) [CRAN]
## Installing 4 packages: cli, raster, htmlwidgets, svglite
## Warning: package 'raster' is in use and will not be installed
##
## There are binary versions available but the source versions are later:
## binary source needs_compilation
## cli 3.5.0 3.6.0 TRUE
## svglite 2.1.0 2.1.1 TRUE
##
## Binaries will be installed
## package 'cli' successfully unpacked and MD5 sums checked
## Warning: cannot remove prior installation of package 'cli'
## Warning in file.copy(savedcopy, lib, recursive = TRUE): problem copying
## C:\Program Files\R\R-4.2.2\library\00LOCK\cli\libs\x64\cli.dll to C:\Program
## Files\R\R-4.2.2\library\cli\libs\x64\cli.dll: Permission denied
## Warning: restored 'cli'
## package 'htmlwidgets' successfully unpacked and MD5 sums checked
## package 'svglite' successfully unpacked and MD5 sums checked
##
## The downloaded binary packages are in
## C:\Users\Administrator.KJLWS11\AppData\Local\Temp\Rtmp84rxPr\downloaded_packages
## Running `R CMD build`...
## * checking for file 'C:\Users\Administrator.KJLWS11\AppData\Local\Temp\Rtmp84rxPr\remotes7bc8424375d2\r-spatial-mapview-64f865a/DESCRIPTION' ... OK
## * preparing 'mapview':
## * checking DESCRIPTION meta-information ... OK
## * checking for LF line-endings in source and make files and shell scripts
## * checking for empty or unneeded directories
## * building 'mapview_2.11.0.9005.tar.gz'
library(mapview)
##
## Attaching package: 'mapview'
## The following object is masked _by_ '.GlobalEnv':
##
## trails
mapView(wolfyht, zcol="NAME", native.crs = FALSE, legend = TRUE, cex=5, lwd=2, map.type = "Esri.DeLorme") + mcp2
mapView(wolfyht, zcol="NAME", legend = TRUE, cex=5, lwd=2, map.type = "Esri.WorldImagery") + mcp2
## Open Topo Map
mapView(wolfyht, zcol="NAME", legend = TRUE, cex=5, lwd=2, map.type = "OpenTopoMap") + mcp2
## Esri World Topo Map
mapView(wolfyht, zcol="NAME", legend = TRUE, cex=5, lwd=2, map.type = "Esri.WorldTopoMap") + mcp2